40 research outputs found

    Moth olfactory receptor neurons adjust their encoding efficiency to temporal statistics of pheromone fluctuations

    The efficient coding hypothesis predicts that sensory neurons adjust their coding resources to optimally represent the stimulus statistics of their environment. To test this prediction in the moth olfactory system, we developed a stimulation protocol that mimics the natural temporal structure within a turbulent pheromone plume. We report that responses of antennal olfactory receptor neurons to pheromone encounters follow the temporal fluctuations in such a way that the most frequent stimulus timescales are encoded with maximum accuracy. We also observe that the average coding precision of the neurons, when adjusted to the stimulus-timescale statistics at a given distance from the pheromone source, is higher than if the same encoding model is applied at a shorter, non-matching distance. Finally, the coding accuracy profile and the stimulus-timescale distribution are related in the manner predicted by information theory for the many-to-one convergence scenario of the moth peripheral sensory system.

    Neuronal jitter: can we measure the spike timing dispersion differently?


    Information capacity in the weak-signal approximation

    We derive an approximate expression for mutual information in a broad class of discrete-time stationary channels with continuous input, under the constraint of vanishing input amplitude or power. The approximation describes the input by its covariance matrix, while the channel properties are described by the Fisher information matrix. This separation of input and channel properties allows us to analyze the optimality conditions in a convenient way. We show that input correlations in memoryless channels do not affect channel capacity, since their effect decreases fast with vanishing input amplitude or power. On the other hand, for channels with memory, properly matching the input covariances to the dependence structure of the noise may lead to almost noiseless information transfer, even for intermediate values of the noise correlations. Since many model systems described in mathematical neuroscience and biophysics operate in the high-noise regime under weak-signal conditions, we believe that the described results are of potential interest to researchers in these areas as well.
    Comment: 11 pages, 4 figures; accepted for publication in Physical Review
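
    The abstract does not give the explicit formula, so the following is a minimal sketch under assumptions: it uses the common small-signal form I ≈ ½ tr(J Σ) (in nats), with Σ the input covariance and J the channel's Fisher information matrix, and an additive Gaussian noise channel with correlated noise (for which J equals the inverse noise covariance). The matrices, correlation value, and power level are illustrative, not taken from the paper; the point is only to show how matching the input covariance to the noise dependence structure raises the weak-signal information.

```python
import numpy as np

rho = 0.8            # noise correlation between two channel uses (illustrative)
P   = 1e-3           # small total input power (weak-signal regime)

# Channel with memory: additive Gaussian noise with correlated samples,
# for which the Fisher information matrix is the inverse noise covariance.
Sigma_N = np.array([[1.0, rho],
                    [rho, 1.0]])
J = np.linalg.inv(Sigma_N)

def mi_weak(Sigma_X):
    """Assumed small-signal approximation: I ~ 0.5 * tr(J @ Sigma_X), in nats."""
    return 0.5 * np.trace(J @ Sigma_X)

def mi_exact_gaussian(Sigma_X):
    """Exact Gaussian-channel mutual information, for comparison."""
    return 0.5 * np.log(np.linalg.det(np.eye(2) + J @ Sigma_X))

# White (uncorrelated) input vs. input whose correlation opposes the noise
# correlation, both spending the same total power P.
Sigma_white   = (P / 2) * np.eye(2)
Sigma_matched = (P / 2) * np.array([[1.0, -1.0],
                                    [-1.0, 1.0]])

for name, S in [("white input  ", Sigma_white), ("matched input", Sigma_matched)]:
    print(f"{name}: approx {mi_weak(S):.3e} nats, exact {mi_exact_gaussian(S):.3e} nats")
```

    With the numbers above, the covariance-matched input transmits roughly twice as much information as the white input at the same power, and the approximation stays close to the exact Gaussian value because the power is small.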

    Variability Measures of Positive Random Variables

    During the stationary part of the neuronal spiking response, the stimulus can be encoded in the firing rate, but also in the statistical structure of the interspike intervals. We propose and discuss two information-based measures of statistical dispersion of the interspike interval distribution: the entropy-based dispersion and the Fisher information-based dispersion. The measures are compared with the frequently used concept of standard deviation. It is shown that standard deviation is not well suited to quantify some aspects of dispersion that are often expected intuitively, such as the degree of randomness. The proposed dispersion measures are not entirely independent, although each describes the interspike intervals from a different point of view. The new methods are applied to common models of neuronal firing and to both simulated and experimental data.
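
    As an illustration of why standard deviation alone can misrepresent randomness, the sketch below compares two positive "interspike interval" distributions that share the same standard deviation but differ strongly in differential entropy. Here exp(h) is used only as a generic entropy-based scale; it may differ in normalization from the paper's entropy-based dispersion, and the two distributions are illustrative choices, not the models analyzed in the paper.

```python
import numpy as np
from scipy import stats

# Two positive "interspike interval" distributions with identical standard deviation.
exp_isi   = stats.expon(scale=1.0)           # mean 1, SD 1 (Poisson-like firing)
gamma_isi = stats.gamma(a=0.25, scale=2.0)   # mean 0.5, SD 1 (mass near zero, heavy tail)

for name, dist in [("exponential", exp_isi), ("gamma(0.25)", gamma_isi)]:
    sd = float(dist.std())
    h  = float(dist.entropy())               # differential entropy in nats
    print(f"{name:12s} SD = {sd:.2f}   exp(h) = {np.exp(h):.2f}")
```

    Both distributions report SD = 1, yet the entropy-based scale exp(h) is several times smaller for the gamma case, reflecting its lower effective randomness.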

    Efficient Olfactory Coding in the Pheromone Receptor Neuron of a Moth

    The concept of coding efficiency holds that sensory neurons are adapted, through both evolutionary and developmental processes, to the statistical characteristics of their natural stimulus. Encouraged by the successful invocation of this principle to predict how neurons encode natural auditory and visual stimuli, we attempted its application to olfactory neurons. The pheromone receptor neuron of the male moth Antheraea polyphemus, for which quantitative properties of both the natural stimulus and the reception processes are available, was selected. We predicted several characteristics that the pheromone plume should possess under the hypothesis that the receptors perform optimally, i.e., transfer as much information about the stimulus per unit time as possible. Our results demonstrate that the statistical characteristics of the predicted stimulus, e.g., the probability distribution function of the stimulus concentration, the spectral density function of the stimulation course, and the intermittency, are in good agreement with those measured experimentally in the field. These results should stimulate further quantitative studies on the evolutionary adaptation of olfactory nervous systems to odorant plumes and on the plume characteristics that are most informative for the ‘sniffer’. Both aspects are relevant to the design of olfactory sensors for odour-tracking robots.
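
    The abstract does not spell out how the optimal stimulus statistics are derived. A classic, closely related construction is Laughlin-style histogram equalization, in which a noiseless efficient-coding argument predicts the information-maximizing stimulus density to be proportional to the slope of the neuron's dose-response curve. The sketch below applies that idea to a made-up Hill-type response function; the curve, its parameters, and the concentration range are assumptions for illustration, not the receptor model used in the paper.

```python
import numpy as np

# Illustrative Hill-type dose-response curve of a pheromone receptor neuron
# (K, n, and the concentration range are made-up parameters, not from the paper).
c = np.logspace(-3, 2, 1000)          # stimulus concentration (arbitrary units)
K, n = 1.0, 1.5                       # half-activation concentration and Hill coefficient
r = c**n / (K**n + c**n)              # normalized firing-rate response

# Histogram equalization: the information-maximizing stimulus density is taken
# proportional to the slope of the response curve, p(c) ~ dr/dc.
p = np.gradient(r, c)
p /= np.sum(p * np.gradient(c))       # normalize to a probability density over c

print("predicted most probable concentration:", c[np.argmax(p)])
```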

    Efficient information transmission and stimulus coding in neuronal models

    Author affiliation: Institute of Physiology, Academy of Sciences of the Czech Republic

    The effect of inhibition on rate code efficiency indicators

    In this paper we investigate the rate coding capabilities of neurons whose input signals are alterations of the base state of balanced inhibitory and excitatory synaptic currents. We consider different regimes of the excitation-inhibition relationship and an established conductance-based leaky integrator model with adaptive threshold and parameter sets recreating biologically relevant spiking regimes. We find that, for a given mean post-synaptic firing rate, an increased ratio of inhibition to excitation counter-intuitively tends to lead to a higher signal-to-noise ratio (SNR). On the other hand, the inhibitory input significantly reduces the dynamic coding range of the neuron. We quantify the joint effect of SNR and dynamic coding range by computing the metabolic efficiency: the maximal amount of information per one ATP molecule expended (in bits/ATP). Moreover, by calculating the metabolic efficiency we are able to predict the shapes of the post-synaptic firing rate histograms that may be tested on experimental data. Likewise, optimal stimulus input distributions are predicted; however, we show that the optimum can essentially be reached with a broad range of input distributions. Finally, we examine which parameters of the neuronal model used are the most important for metabolically efficient information transfer.

    Author summary: Neurons communicate by firing action potentials, which can be considered as all-or-none events. The classical rate coding hypothesis states that neurons communicate information about stimulus intensity by altering their firing frequency. Cortical neurons typically receive a signal from many different neurons, which, depending on the synapse type, either depolarize (excitatory input) or hyperpolarize (inhibitory input) the neural membrane. We use a neural model with excitatory and inhibitory synaptic conductances to reproduce in-vivo-like activity and investigate how the intensity of presynaptic inhibitory activity affects the neuron's ability to transmit information through a rate code. We reach the counter-intuitive result that an increase in inhibition improves the signal-to-noise ratio of the neural response, despite introducing additional noise to the input signal. On the other hand, inhibition also limits the neuronal output range. However, in the end, the actual amount of information transmitted (in bits per energy expended) is remarkably robust to the level of inhibition.
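
    To make the bits/ATP notion concrete, the sketch below computes mutual information per expected ATP cost for a toy rate-coding channel (Poisson spike counts in a window, energetic cost proportional to firing rate) and crudely searches over input distributions. The rate levels, window length, ATP-per-spike figure, and the random search are all illustrative stand-ins, not the paper's conductance-based model or its optimization procedure.

```python
import numpy as np
from scipy.stats import poisson

rates = np.array([2.0, 10.0, 30.0, 60.0])   # candidate firing-rate levels (Hz), illustrative
T = 0.1                                      # coding window (s), illustrative
atp_per_spike = 1e8                          # order-of-magnitude placeholder for ATP cost per spike

counts = np.arange(0, 100)
# p(y|x): Poisson spike-count distribution for each rate level
p_y_given_x = np.array([poisson.pmf(counts, r * T) for r in rates])

def bits_per_atp(p_x):
    """Mutual information (bits) divided by expected ATP cost, for input distribution p_x."""
    p_y = p_x @ p_y_given_x
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(p_y > 0, p_y_given_x / p_y, 1.0)
        mi = np.nansum(p_x[:, None] * p_y_given_x * np.log2(ratio))
    cost = atp_per_spike * T * np.dot(p_x, rates)    # expected ATP spent per coding window
    return mi / cost

# Crude random search over input distributions on the simplex
rng = np.random.default_rng(0)
best = max((rng.dirichlet(np.ones(len(rates))) for _ in range(20000)), key=bits_per_atp)
print("near-optimal input distribution:", np.round(best, 3))
print("metabolic efficiency (bits/ATP):", bits_per_atp(best))
```

    The search typically favors input distributions that lean on the cheaper (lower-rate) levels, which is the qualitative trade-off the metabolic-efficiency measure is designed to capture.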